# On-device optimization
## MobileLLM-125M

MobileLLM is Meta's series of sub-billion-parameter language models optimized for resource-constrained devices; its deep-and-narrow architecture design significantly improves on-device inference efficiency.

- Tags: Large Language Model, Transformers
- Publisher: facebook
- Stats: 1,675 · 111
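The deep-and-narrow trade-off mentioned above can be illustrated with rough transformer parameter arithmetic: each block costs about 12·d² parameters, so for a fixed budget, shrinking the hidden width frees parameters for more layers. A minimal sketch; the counting formula and both example configurations are illustrative assumptions, not MobileLLM's actual architecture:

```python
def transformer_params(n_layers: int, d_model: int, vocab_size: int) -> int:
    """Rough decoder-only parameter count (illustrative approximation).

    Each block: ~4*d^2 for the attention projections plus ~8*d^2 for a
    feed-forward layer with 4x expansion; embeddings assumed tied.
    """
    per_block = 12 * d_model * d_model
    embedding = vocab_size * d_model
    return n_layers * per_block + embedding

# Two hypothetical configurations with roughly the same ~110M budget:
wide_shallow = transformer_params(n_layers=12, d_model=768, vocab_size=32000)
deep_narrow = transformer_params(n_layers=30, d_model=512, vocab_size=32000)
print(wide_shallow, deep_narrow)  # both land near 110M parameters
```

Under a matched budget, the deep-narrow variant trades width for depth, which the MobileLLM description credits for better on-device quality per parameter.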
## Zamba2-2.7B

Zamba2-2.7B is a hybrid model that combines state-space and Transformer blocks, pairing Mamba2 blocks with a shared attention module to deliver high performance at low latency.

- License: Apache-2.0
- Tags: Large Language Model, Transformers
- Publisher: Zyphra
- Stats: 2,550 · 77
## MobiLlama-1B-Chat

MobiLlama-1B-Chat is an instruction-following model fine-tuned from MobiLlama-1B for resource-constrained devices, emphasizing efficiency, a low memory footprint, and fast responses.

- License: Apache-2.0
- Tags: Large Language Model, Transformers, English
- Publisher: MBZUAI
- Stats: 44 · 25
## Blockchainlabs 7B Merged Test2 4 Prune Sft 4bit DPO Orca

A compact 7B-parameter LLM optimized for on-device use, pruned and trained with DPO.

- Tags: Large Language Model, Transformers, English
- Publisher: alnrg2arg
- Stats: 18 · 2
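The "4bit" in this model's name refers to weight quantization, a common step for fitting a 7B model on a device. As a hedged illustration of the basic idea only, here is plain symmetric round-to-nearest 4-bit quantization in pure Python; production schemes (e.g. GPTQ or NF4) are considerably more sophisticated:

```python
def quantize_4bit(weights, scale=None):
    """Symmetric round-to-nearest quantization to 4-bit integers in [-8, 7]."""
    if scale is None:
        # Largest positive 4-bit value is 7, so scale the max magnitude onto it.
        scale = max(abs(w) for w in weights) / 7.0
    q = [max(-8, min(7, round(w / scale))) for w in weights]
    return q, scale

def dequantize_4bit(q, scale):
    """Recover approximate float weights from the 4-bit integers."""
    return [v * scale for v in q]

# Toy weight vector (made up for illustration):
w = [0.12, -0.5, 0.33, 0.07, -0.21]
q, s = quantize_4bit(w)
w_hat = dequantize_4bit(q, s)
err = max(abs(a - b) for a, b in zip(w, w_hat))
```

Each weight now needs only 4 bits plus one shared scale per group, and the round-trip error stays within half a quantization step.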
## Wav2Vec2-Large-SUPERB-KS

A speech classification model based on the pre-trained Wav2Vec2-Large-LV60, fine-tuned for the SUPERB keyword-spotting (KS) task.

- License: Apache-2.0
- Tags: Speech Recognition, Transformers, English
- Publisher: superb
- Stats: 18 · 1